
Conversation

@albertz (Member) commented Sep 15, 2021

Adds support for accumulating dynamic sizes (DimensionTag.dyn_size_ext) within a recurrent context (control_flow_context). E.g. when the dyn size (seq lengths) has shape [B] inside the loop, and this dyn size changes in every iteration (i.e. its control_flow_context is set to the rec loop), then the accumulated dyn size outside the loop has shape [T,B]. See the sketch below.
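For intuition, here is a minimal plain-TensorFlow sketch (not the PR's actual RETURNN code) of what this accumulation amounts to: a TensorArray collects the [B]-shaped seq lengths of every iteration, and stacking it outside the loop yields the accumulated [T,B] dyn size. The callable `seq_lens_for_frame` is a hypothetical stand-in for whatever computes the per-frame lengths.

```python
import tensorflow as tf

def accumulate_dyn_sizes(num_frames, seq_lens_for_frame):
    """Accumulate per-frame seq lengths of shape [B] into a [T, B] tensor.

    `seq_lens_for_frame` is a hypothetical callable: frame index -> int32 [B].
    """
    ta = tf.TensorArray(dtype=tf.int32, size=num_frames)  # one entry per frame

    def body(t, ta):
        # Inside the loop, the dyn size for this frame has shape [B].
        return t + 1, ta.write(t, seq_lens_for_frame(t))

    _, ta = tf.while_loop(lambda t, ta: t < num_frames, body, [tf.constant(0), ta])
    # Outside the loop, the accumulated dyn size has shape [T, B].
    return ta.stack()
```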

Further, when some layer inside the loop has a dynamic shape [B,T_per_frame], where T_per_frame is the per-frame dyn size mentioned above, the current TensorArray logic cannot be used, because it expects elements to have the same shape in every frame. This PR also extends that logic to support such varying shapes.
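As a rough illustration of the problem (again a sketch under assumptions, not the PR's actual implementation, which may differ): a plain tf.TensorArray infers its element shape from the first write, so frames with differing T_per_frame cannot simply be written and stacked. One common workaround is to pad every frame's value to a known maximum length and accumulate the actual per-frame lengths separately; `values_for_frame` and `max_len` are hypothetical here.

```python
import tensorflow as tf

def accumulate_var_shaped(num_frames, max_len, values_for_frame):
    """Accumulate per-frame values of shape [B, T_per_frame], T varying per frame.

    `values_for_frame` is a hypothetical callable: frame index -> float32 [B, T_t].
    `max_len` is an assumed upper bound on T_per_frame across all frames.
    """
    ta_vals = tf.TensorArray(dtype=tf.float32, size=num_frames)
    ta_lens = tf.TensorArray(dtype=tf.int32, size=num_frames)

    def body(t, ta_vals, ta_lens):
        vals = values_for_frame(t)   # [B, T_per_frame], shape varies per frame
        cur_len = tf.shape(vals)[1]
        # Pad to a common [B, max_len] so every TensorArray element has the same shape.
        vals = tf.pad(vals, [[0, 0], [0, max_len - cur_len]])
        return t + 1, ta_vals.write(t, vals), ta_lens.write(t, cur_len)

    _, ta_vals, ta_lens = tf.while_loop(
        lambda t, *_: t < num_frames, body, [tf.constant(0), ta_vals, ta_lens])
    # Stacked values: [T, B, max_len]; actual lengths per frame: [T].
    return ta_vals.stack(), ta_lens.stack()
```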

This was requested in #635 and also makes sense for #589 and #579 (and other related commits for #391).

@albertz albertz force-pushed the albert-accum-loop-dyn-size branch 3 times, most recently from bd72b69 to 3fe1cac Compare September 18, 2021 21:09
@albertz albertz marked this pull request as ready for review September 21, 2021 13:28
@albertz albertz requested a review from a team as a code owner September 21, 2021 13:28
@albertz (Member, Author) commented Sep 21, 2021

Finally done. Again, more effort than expected. This still needs to be cleaned up; maybe some aspects will be split out separately, like the DimensionTag.declare_same_as changes.

@albertz albertz force-pushed the albert-accum-loop-dyn-size branch 2 times, most recently from 70672a3 to fb7f54d Compare September 21, 2021 14:09
@JackTemaki and @albertz each left a comment that has since been minimized.

@albertz albertz force-pushed the albert-accum-loop-dyn-size branch from 3706730 to 3d17449 Compare September 21, 2021 20:05
@albertz albertz merged commit 6f7f622 into master Sep 21, 2021
@albertz albertz deleted the albert-accum-loop-dyn-size branch September 21, 2021 20:30